\documentclass[10pt]{article}
\usepackage{amssymb,amsmath}
\usepackage[hmargin=1cm,vmargin=1cm]{geometry}
\begin{document}
{\large Probability}
\begin{align*}
\text{\bf Possible }&\text{\bf Outcomes of a Repeatable Experiment}\\
&\text{If an experiment can be performed repeatedly under the same conditions, some outcomes are bound to happen}\\
&\text{more often and some less so. Probability, or \it chance\rm, is (statistically) the ratio of the number of ``successful''}\\
&\text{outcomes to the number of \it all \rm ``possible'' outcomes.}\\
\\
&\text{Fundamental Formula:}\quad\boxed{\frac{\text{Number of successful outcomes}}{\text{Number of all possible outcomes}}}\\
\\
&\text{In this context, no experiments need to be performed. We assume all ``simple'' outcomes (like drawing a particular}\\
&\text{ball from a bag) are equally probable. The mathematics here is to calculate the probability of ``complex'' events}\\
&\text{from the probability of ``simple'' events according to certain rules.}\\
\\
&\text{Also assumed is that all elements are ``distinguishable'' for the purpose of statistical experiments. For example,}\\
&\text{there are ten ``distinguishable'' ways to draw a ball out of a bag of ten, even though all balls are identical.}\\
\\
\text{\bf Set }&\text{\bf Theory Basics:}\\
&\text{Set notations used here:}\quad\text{$\mathcal P(A)$: Power set -- the set of all subsets of $A$;}\quad\text{$S$: Universal set;}\quad\text{$\emptyset$ (or \{~\}): Empty set.}\\
&\text{$A^c$: Complement of $A$;}\quad \text{$|A|$: Number of elements in $A$;}\quad |A^c|=|S|-|A|;\quad |A\cup B|=|A|+|B|-|A\cap B|.\\
&\text{$A$ and $B$ are \bf disjoint \rm iff $A\cap B=\emptyset$.}\quad \text{Disjoint sets $A_r~(r=1,2,\ldots,k)$ \bf partition \rm $B$ iff $\bigcup_{r=1}^k A_r=B$.}\\
\\
\text{\bf Defini}&\text{\bf tions:}\\
&\text{\bf Sample space: \rm Set of all possible outcomes.}\quad \text{\bf Event: \rm Subset of a sample space.}\\
&\text{\bf Probability: \rm A real function $P$ on $\mathcal P(S)$ that satisfies:}\\
&\quad\text{(a) For all $A\subseteq S,~~0\le P(A)\le 1$~~~(Probability always lies between 0 and 1 inclusive);}\\
&\quad\text{(b) $P(\emptyset)=0$~~~(The impossible event has probability 0);}\\
&\quad\text{(c) $P(S)=1$~~~(The certain event has probability 1);}\\
&\quad\text{(d) $A\cap B=\emptyset\Rightarrow P(A\cup B)=P(A)+P(B)$}\\
&\quad\quad\quad\text{(The probability of either one of two mutually exclusive events equals the sum of their probabilities).}\\
\\
&\text{From the above definition, the following rules can be derived for a finite sample space $S$:}\\
&\quad P(A)=\sum_{a\in A}P(\{a\}).\quad\text{The probability of an event is the sum of the probabilities of its (mutually exclusive) elements.}\\
&\quad P(A)=\frac{|A|}{|S|},~\text{if $P(\{a\})$ is constant for all $a\in S$.}\quad\text{Probability is the ratio of successes over possibles.}\\
&\quad\sum_{a\in S}P(\{a\})=1.\quad\text{The elementary probabilities sum to 1.}\\
%
&\text{Also:}\quad P(A\cup B)=P(A)+P(B)-P(A\cap B);\quad P(A^c)=1-P(A);\quad\text{If $A\subseteq B$, then $P(A)\le P(B)$.}\\
%
&\text{\bf Conditional probability \rm of $A$ \bf given \rm $B$:}\quad P(A|B)=\frac{P(A\cap B)}{P(B)},\quad\text{where }P(B)\ne 0.\\
%
&\text{This gives the \bf Multiplication Rule\rm:}\quad P(A\cap B)=P(A|B)P(B)=P(B|A)P(A).\\
\end{align*}
\begin{align*}
\text{\bf Parti}&\text{\bf tioning:}\quad\text{Suppose $A_1,\ldots,A_n$ partition $S$, with $P(A_i)\ne 0$ for each $i$.}\\
&\text{\bf Total Probability Rule: \rm If $B$ is an event ($B\subseteq S$), then}\quad P(B)=\sum_{i=1}^n P(B|A_i)P(A_i).\\
&\text{\bf Bayes' Rule: \rm If $B$ is an event with $P(B)\ne 0$, then}\quad P(A_j|B)=\frac{P(B|A_j)P(A_j)}{\sum_{i=1}^n P(B|A_i)P(A_i)}.\\
\\
\text{\bf Statis}&\text{\bf tical Independence:}\\
&\text{Events $A$ and $B$ are \bf (statistically) independent \rm iff}\quad P(A\cap B)=P(A)P(B).\\
&\text{Events $A_1,\ldots,A_n$ are \bf mutually independent \rm iff for any subcollection }A_{i_1},\ldots,A_{i_k},~P(A_{i_1}\cap\ldots\cap A_{i_k})=P(A_{i_1})\times\ldots\times P(A_{i_k}).\\
%
%
%\text{Sample Space:}\quad
%&\text{The set of all permutations.}\\
%\\
%&\text{e.g. For 3 distinct elements $A$, $B$, and $C$, there are 6 permutations, which form the sample space}\\
%&S=\{ABC, ACB, BCA, BAC, CAB, CBA\}\text{. Its cardinality (number of members) is }|S|=3!=6\:.\\
\end{align*}
\end{document}
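A short worked example may help tie the Total Probability and Bayes' rules together. It is not part of the original notes; the two-bag setup and its numbers are purely illustrative, written in the same `align*` style as the notes:

```latex
\begin{align*}
\text{\bf Exam}&\text{\bf ple:}\quad\text{Bag 1 holds 3 red and 1 white ball; Bag 2 holds 1 red and 3 white. A bag is chosen at random}\\
&\text{($P(A_1)=P(A_2)=\tfrac12$) and one ball is drawn. Let $B$ be the event that the ball is red.}\\
&\text{Total Probability:}\quad P(B)=P(B|A_1)P(A_1)+P(B|A_2)P(A_2)=\tfrac34\cdot\tfrac12+\tfrac14\cdot\tfrac12=\tfrac12.\\
&\text{Bayes' Rule:}\quad P(A_1|B)=\frac{P(B|A_1)P(A_1)}{P(B)}=\frac{\tfrac34\cdot\tfrac12}{\tfrac12}=\tfrac34.
\end{align*}
```

The drawn ball being red doubles the odds that Bag 1 was chosen, exactly as Bayes' rule with the total-probability denominator predicts.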
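The finite-sample-space rules above can also be checked mechanically. Below is a minimal Python sketch (not from the notes; the events `A` and `B` are my own choices) that builds the equally likely sample space of two fair dice and verifies the Fundamental Formula, inclusion-exclusion, conditional probability, and independence with exact rational arithmetic:

```python
from fractions import Fraction

# Equally likely sample space: ordered rolls of two fair dice.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def P(event):
    """Fundamental Formula: successful outcomes over all possible outcomes."""
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] == 6          # first die shows 6
B = lambda s: s[0] + s[1] == 7   # the faces sum to 7

P_A = P(A)                                 # 6/36 = 1/6
P_B = P(B)                                 # 6/36 = 1/6
P_AB = P(lambda s: A(s) and B(s))          # only (6, 1): 1/36
P_AorB = P(lambda s: A(s) or B(s))

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P_AorB == P_A + P_B - P_AB
# Conditional probability: P(A|B) = P(A ∩ B) / P(B)
P_A_given_B = P_AB / P_B                   # 1/6
# A and B happen to be independent here: P(A ∩ B) = P(A) P(B)
assert P_AB == P_A * P_B
```

Using `Fraction` instead of floats keeps every probability exact, so the identities hold with `==` rather than approximate comparison.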